Deep learning with transfer functions: new applications in system identification

Abstract

This paper presents a linear dynamical operator described in terms of a rational transfer function, endowed with a well-defined and efficient back-propagation behavior for automatic derivatives computation. The operator enables end-to-end training of structured networks containing transfer functions and other differentiable units by exploiting standard deep learning software. Two relevant applications in system identification are presented. The first one consists in the integration of prediction error methods in deep learning: the operator is included as the last layer of a neural network in order to obtain the optimal one-step-ahead prediction error. The second one considers the identification of general block-oriented models from quantized data. These models are constructed by combining linear dynamical operators with static nonlinearities described by feed-forward neural networks. A custom loss function corresponding to the log-likelihood of the quantized output observations is defined. For gradient-based optimization, its gradient is computed by applying the back-propagation algorithm through the whole network. System identification benchmarks are used to show the effectiveness of the proposed methodologies.
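As a minimal sketch of the core idea, the linear dynamical operator G(q) = B(q)/A(q) amounts to an IIR difference equation applied to the input sequence. The toy function below (not the paper's implementation, which additionally provides an efficient custom backward pass for training) illustrates the forward filtering recurrence; coefficient names and the first-order example are illustrative assumptions.

```python
def tf_filter(b, a, u):
    """Filter input sequence u through G(q) = B(q)/A(q).

    b: numerator coefficients [b_0, ..., b_nb]
    a: denominator coefficients [a_1, ..., a_na], with the
       leading denominator coefficient assumed equal to 1 (monic).
    Implements y[k] = sum_i b_i u[k-i] - sum_j a_j y[k-1-j],
    with zero initial conditions.
    """
    y = []
    for k in range(len(u)):
        yk = sum(b[i] * u[k - i] for i in range(len(b)) if k - i >= 0)
        yk -= sum(a[j] * y[k - 1 - j] for j in range(len(a)) if k - 1 - j >= 0)
        y.append(yk)
    return y

# First-order example: G(q) = 0.1 / (1 - 0.9 q^-1), unit-step input.
# The step response converges monotonically toward the DC gain 1.
y = tf_filter([0.1], [-0.9], [1.0] * 5)
```

In the paper's setting, this recurrence is wrapped as a differentiable layer, so its coefficients can be learned jointly with the static neural-network blocks by standard gradient descent.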


Similar Articles

Multimodal Transfer Deep Learning with Applications in Audio-Visual Recognition

We propose a transfer deep learning (TDL) framework that can transfer the knowledge obtained from a single-modal neural network to a network with a different modality. Specifically, we show that we can leverage speech data to fine-tune the network trained for video recognition, given an initial set of audio-video parallel dataset within the same semantics. Our approach first learns the analogyp...


Deep Transfer Learning for Person Re-identification

Person re-identification (Re-ID) poses a unique challenge to deep learning: how to learn a deep model with millions of parameters on a small training set of few or no labels. In this paper, a number of deep transfer learning models are proposed to address the data sparsity problem. First, a deep network architecture is designed which differs from existing deep Re-ID models in that (a) it is mor...


Coresets For Monotonic Functions with Applications to Deep Learning

Coreset (or core-set) in this paper is a small weighted subset Q of the input set P with respect to a given monotonic function f : R → R that provably approximates its fitting loss ∑ p∈P f(p · x) to any given x ∈ R. Using Q we can obtain approximation to x∗ that minimizes this loss, by running existing optimization algorithms on Q. We provide: (i) a lower bound that proves that there are sets w...


Deep Transfer Learning with Joint Adaptation Networks

Deep networks rely on massive amounts of labeled data to learn powerful models. For a target task short of labeled data, transfer learning enables model adaptation from a different source domain. This paper addresses deep transfer learning under a more general scenario that the joint distributions of features and labels may change substantially across domains. Based on the theory of Hilbert spa...


Asymmetric Transfer Learning with Deep Gaussian Processes

We introduce a novel Gaussian process based Bayesian model for asymmetric transfer learning. We adopt a two-layer feed-forward deep Gaussian process as the task learner of source and target domains. The first layer projects the data onto a separate non-linear manifold for each task. We perform knowledge transfer by projecting the target data also onto the source domain and linearly combining it...



Journal

Journal: IFAC-PapersOnLine

Year: 2021

ISSN: 2405-8963, 2405-8971

DOI: https://doi.org/10.1016/j.ifacol.2021.08.395